Capacity, Mutual Information, and Coding for Finite-State Markov Channels
Abstract
The Finite-State Markov Channel (FSMC) is a discrete time-varying channel whose variation is determined by a finite-state Markov process. These channels have memory due to the Markov channel variation. We obtain the FSMC capacity as a function of the conditional channel state probability. We also show that for i.i.d. channel inputs, this conditional probability converges weakly, and the channel's mutual information is then a closed-form continuous function of the input distribution. We next consider coding for FSMCs. In general, the complexity of maximum-likelihood decoding grows exponentially with the channel memory length. Therefore, in practice, interleaving and memoryless channel codes are used. This technique results in some performance loss relative to the inherent capacity of channels with memory. We propose a maximum-likelihood decision-feedback decoder with complexity that is independent of the channel memory. We calculate the capacity and cutoff rate of our technique, and show that it preserves the capacity of certain FSMCs. We also compare the performance of the decision-feedback decoder with that of interleaving and memoryless channel coding on a fading channel with 4PSK modulation.

I. Introduction

This paper extends the capacity and coding results of M. Mushkin and I. Bar-David [1] for the Gilbert-Elliot channel to a more general time-varying channel model. The Gilbert-Elliot channel is a stationary two-state Markov chain, where each state is a binary symmetric channel (BSC), as in Figure 1. The transition probabilities between states are $g$ and $b$, respectively, and the crossover probabilities for the "good" and "bad" BSCs are $p_G$ and $p_B$, respectively, where $p_G < p_B$. Let $x_n \in \{0,1\}$, $y_n \in \{0,1\}$, and $z_n = x_n \oplus y_n$ denote respectively the channel input, channel output, and channel error on the $n$th transmission. In [1], the capacity of the Gilbert-Elliot channel is derived as

$$C = 1 - \lim_{n \to \infty} E[h(q_n)] = 1 - E[h(q_\infty)], \tag{1}$$

where $h$ is the binary entropy function, $q_n = p(z_n = 1 \mid z^{n-1})$, $q_n$ converges to $q_\infty$ in distribution, and $q_\infty$ is independent of the initial channel state. In this paper we derive the capacity of a more general finite-state Markov channel, where the channel states are not necessarily BSCs. We model the channel as a Markov chain $S_n$ which takes values in a finite state space $C$ of memoryless channels with finite input and output alphabets. The conditional input/output probability is thus $p(y_n \mid x_n, S_n)$, where $x_n$ …
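As a rough numerical illustration of (1), the sketch below simulates the Gilbert-Elliot error process and tracks $q_n$ by Bayesian filtering of the hidden channel state; the long-run average of $h(q_n)$ then stands in for $E[h(q_\infty)]$. This is only a sketch under stated assumptions: the parameter values, the labeling conventions for $g$ and $b$, and the filtering implementation are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (hypothetical, not from the paper):
b = 0.05              # assumed P(good -> bad) state transition
g = 0.10              # assumed P(bad -> good) state transition
pG, pB = 0.01, 0.30   # BSC crossover probabilities, pG < pB

def h(q):
    """Binary entropy in bits, with h(0) = h(1) = 0."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

N = 200_000
pi_good = 0.5        # belief P(S_n = good | z^{n-1}); any initial value works
state_good = True    # the true (hidden) channel state
acc = 0.0

for n in range(N):
    # q_n = p(z_n = 1 | z^{n-1}) follows from the current state belief.
    q_n = pi_good * pG + (1 - pi_good) * pB
    acc += h(q_n)

    # Draw the error z_n from the true state, then evolve the hidden state.
    z = rng.random() < (pG if state_good else pB)
    state_good = (rng.random() >= b) if state_good else (rng.random() < g)

    # Bayes update of the belief given z_n, then one-step Markov prediction.
    lik_g = pG if z else 1 - pG
    lik_b = pB if z else 1 - pB
    post = pi_good * lik_g / (pi_good * lik_g + (1 - pi_good) * lik_b)
    pi_good = post * (1 - b) + (1 - post) * g

print(f"capacity estimate: {1 - acc / N:.4f} bits per channel use")
```

Because $q_n$ converges weakly to $q_\infty$ regardless of the initial state, the time average converges to the same value for any initial belief `pi_good`; only the error process needs to be simulated, since for a BSC the errors are independent of the (i.i.d. uniform) inputs.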
Similar Resources
Capacity, mutual information, and coding for finite-state Markov channels
The Finite-State Markov Channel (FSMC) is a discrete time-varying channel whose variation is determined by a finite-state Markov process. These channels have memory due to the Markov channel variation. We obtain the FSMC capacity as a function of the conditional channel state probability. We also show that for i.i.d. channel inputs, this conditional probability converges weakly, and the channel...
On Entropy and Lyapunov Exponents for Finite-State Channels
The Finite-State Markov Channel (FSMC) is a time-varying channel having states that are characterized by a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this ch... (a minimal numerical sketch of this Lyapunov-exponent view appears after this list)
Entropy and Mutual Information for Markov Channels with General Inputs
We study new formulas based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...
On the capacity of Markov sources over noisy channels
We present an expectation-maximization method for optimizing Markov process transition probabilities to increase the mutual information rate achievable when the Markov process is transmitted over a noisy finite-state machine channel. The method provides a tight lower bound on the achievable information rate of a Markov process over a noisy channel, and it is conjectured that it actually maximizes...
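The Lyapunov-exponent characterization mentioned in the entries above can be made concrete with a short sketch, again with hypothetical parameters and with the Gilbert-Elliot error process standing in for a general FSMC: the entropy rate of a hidden-Markov error process equals the log-growth rate of the normalized forward probabilities, i.e., the top Lyapunov exponent of a product of random matrices selected by the observed errors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Gilbert-Elliot parameters (same conventions as above):
b, g = 0.05, 0.10
P = np.array([[1 - b, b],      # state 0 = good, state 1 = bad
              [g, 1 - g]])
pz = np.array([[0.99, 0.01],   # p(z | state): rows = state, cols = z in {0,1}
               [0.70, 0.30]])

# Simulate a long error sequence z_1..z_N from the hidden Markov model.
N = 200_000
states = np.empty(N, dtype=int)
s = 0
for n in range(N):
    states[n] = s
    s = rng.choice(2, p=P[s])
z = (rng.random(N) < pz[states, 1]).astype(int)

# Entropy rate as a Lyapunov exponent: accumulate the log of the
# normalization constants of the forward probabilities (matrix products).
alpha = np.array([0.5, 0.5]) * pz[:, z[0]]   # arbitrary initial distribution
log_prob = np.log2(alpha.sum())
alpha /= alpha.sum()
for n in range(1, N):
    alpha = (alpha @ P) * pz[:, z[n]]        # forward recursion
    c = alpha.sum()
    log_prob += np.log2(c)
    alpha /= c

print(f"entropy rate of the error process: {-log_prob / N:.4f} bits/symbol")
```

With matching parameters, one minus this entropy rate should agree with the capacity estimate from the earlier sketch, since $E[h(q_\infty)]$ is exactly the entropy rate of the error process; the growth rate is also insensitive to the arbitrary initial distribution in `alpha`.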